549 research outputs found

    Distributed Lag Linear and Non-Linear Models in R: The Package dlnm

    Distributed lag non-linear models (DLNMs) represent a modelling framework to flexibly describe associations showing potentially non-linear and delayed effects in time series data. This methodology rests on the definition of a crossbasis, a bi-dimensional functional space expressed by the combination of two sets of basis functions, which specify the relationships in the dimensions of predictor and lags, respectively. This framework is implemented in the R package dlnm, which provides functions to fit the broad range of models within the DLNM family and to help interpret the results, with an emphasis on graphical representation. This paper offers an overview of the capabilities of the package, describing the conceptual and practical steps to specify and interpret DLNMs with an example of application to real data.
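
    The basic workflow of the package can be sketched as follows (an illustrative example, not taken from the paper itself: the chicagoNMMAPS data shipped with dlnm are used, and the spline and lag settings are arbitrary assumptions).

        # Sketch of a DLNM: cross-basis, regression model, prediction
        library(dlnm); library(splines)
        data("chicagoNMMAPS", package = "dlnm")

        # Cross-basis for temperature: natural splines in both the
        # predictor and the lag dimension, lags 0-21 days (assumed settings)
        cb <- crossbasis(chicagoNMMAPS$temp, lag = 21,
                         argvar = list(fun = "ns", df = 4),
                         arglag = list(fun = "ns", df = 4))

        # Quasi-Poisson time series regression with control for
        # seasonality, long-term trend and day of the week
        mod <- glm(death ~ cb + ns(time, 7 * 14) + dow,
                   family = quasipoisson(), data = chicagoNMMAPS)

        # Predictions centred at 20C and graphical summaries
        pred <- crosspred(cb, mod, cen = 20, by = 1)
        plot(pred, "overall")   # overall cumulative exposure-response curve
        plot(pred, "contour")   # relative-risk surface over predictor and lag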

    Reducing and meta-analysing estimates from distributed lag non-linear models.

    BACKGROUND: The two-stage time series design represents a powerful analytical tool in environmental epidemiology. Recently, models for both stages have been extended with the development of distributed lag non-linear models (DLNMs), a methodology for simultaneously investigating non-linear and lagged relationships, and multivariate meta-analysis, a methodology to pool estimates of multi-parameter associations. However, the application of both methods in two-stage analyses is prevented by the high-dimensional definition of DLNMs. METHODS: In this contribution we propose a method to reduce DLNMs to simpler summaries, expressed by a reduced set of parameters of one-dimensional functions, which are compatible with current multivariate meta-analytical techniques. The methodology and modelling framework are implemented in R through the packages dlnm and mvmeta. RESULTS: As an illustrative application, the method is adopted for the two-stage time series analysis of temperature-mortality associations using data from 10 regions in England and Wales. R code and data are available as supplementary online material. DISCUSSION AND CONCLUSIONS: The methodology proposed here extends the use of DLNMs in two-stage analyses, obtaining meta-analytical estimates of easily interpretable summaries from complex non-linear and delayed associations. The approach relaxes the assumptions and avoids the simplifications required by simpler modelling approaches.
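
    As a hedged sketch of the two-stage workflow (the list of region-specific data frames, datalist, each with columns temp, death, time, dow and year, is hypothetical, and all modelling choices are illustrative assumptions): the first stage reduces each fitted DLNM to the parameters of the overall cumulative exposure-response with crossreduce(), and the second stage pools them with mvmeta().

        # First stage: region-specific DLNMs reduced to one-dimensional summaries
        library(dlnm); library(mvmeta); library(splines)

        nreg <- length(datalist)            # hypothetical list of regional data frames
        coefall <- matrix(NA, nreg, 4)      # 4 reduced coefficients (df of argvar)
        vcovall <- vector("list", nreg)

        for (i in seq_len(nreg)) {
          d  <- datalist[[i]]
          cb <- crossbasis(d$temp, lag = 21,
                           argvar = list(fun = "ns", df = 4),
                           arglag = list(fun = "ns", df = 5))
          mod <- glm(death ~ cb + ns(time, 7 * length(unique(d$year))) + dow,
                     family = quasipoisson(), data = d)
          # Reduce to the overall cumulative exposure-response parameters
          red <- crossreduce(cb, mod, type = "overall", cen = 20)
          coefall[i, ] <- coef(red)
          vcovall[[i]] <- vcov(red)
        }

        # Second stage: multivariate meta-analysis of the reduced parameters
        mv <- mvmeta(coefall ~ 1, vcovall, method = "reml")
        summary(mv)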

    The Excess Winter Deaths Measure: Why Its Use Is Misleading for Public Health Understanding of Cold-related Health Impacts.

    BACKGROUND: Excess winter deaths, the ratio of average daily deaths in December-March to those in other months, is a measure commonly used by public health practitioners and analysts to assess health burdens associated with wintertime weather. We seek to demonstrate that this measure is fundamentally biased and can lead to misleading conclusions about health impacts associated with current and future winter climate. METHODS: Time series regression analysis of 779,372 deaths from natural causes in London over 15 years (1 August 1997-31 July 2012), collapsed by day of death and linked to daily temperature values. The outcome measures were the excess winter deaths index, and daily and annual deaths attributable specifically to cold. RESULTS: Most of the excess winter deaths are driven by cold: the excess winter deaths index decreased from 1.19 to 1.07 after excluding deaths attributable to low temperatures. Over 40% of cold-attributable deaths occurred outside the December-March period, leading to bias in the excess winter deaths measure. Although there was no relationship between winter severity and annual excess winter deaths, there was a clear correlation with annual cold-attributable deaths. CONCLUSIONS: Excess winter deaths is not an appropriate indicator of cold-related health impacts, and its use should be discontinued. We advocate alternative measures. The findings we present bring into doubt previous claims that cold-related deaths in the UK will not reduce in future as a result of climate change.
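
    A minimal sketch of the excess winter deaths index as defined above, assuming a hypothetical data frame daily with a Date column date and a column deaths of daily counts:

        # Excess winter deaths index: average daily deaths in December-March
        # divided by average daily deaths in the remaining months
        daily$month <- as.numeric(format(daily$date, "%m"))
        winter <- daily$month %in% c(12, 1, 2, 3)

        ewd_index <- mean(daily$deaths[winter]) / mean(daily$deaths[!winter])
        ewd_index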

    Attributable risk from distributed lag models.

    BACKGROUND: Measures of attributable risk are an integral part of epidemiological analyses, particularly when aimed at the planning and evaluation of public health interventions. However, the current definition of such measures does not consider any temporal relationships between exposure and risk. In this contribution, we propose extended definitions of attributable risk within the framework of distributed lag non-linear models, an approach recently proposed for modelling delayed associations in either linear or non-linear exposure-response relationships. METHODS: We classify versions of the attributable number and fraction expressed using either a forward or backward perspective. The former specifies the future burden due to a given exposure event, while the latter summarizes the current burden due to the set of exposure events experienced in the past. In addition, we illustrate how the components related to sub-ranges of the exposure can be separated. RESULTS: We apply these methods to estimate the mortality risk attributable to outdoor temperature in two cities, London and Rome, using time series data for the periods 1993-2006 and 1992-2010, respectively. The analysis provides estimates of the overall mortality burden attributable to temperature, and then separates the components attributable to cold and heat, and to mild and extreme temperatures. CONCLUSIONS: These extended definitions of attributable risk account for the additional temporal dimension which characterizes exposure-response associations, providing more appropriate attributable measures in the presence of dependencies characterized by potentially complex temporal patterns.
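
    A rough sketch of the attributable-number computation (an approximation based on the overall cumulative exposure-response evaluated at each day's temperature; the full forward and backward definitions in the paper, which also redistribute deaths across the lag window, are implemented in the attrdl() function provided by the authors as supplementary material). The data, spline settings and centring temperature are illustrative assumptions.

        library(dlnm); library(splines)
        data("chicagoNMMAPS", package = "dlnm")
        d <- chicagoNMMAPS

        # Fit a standard DLNM, as in the earlier sketch
        cb  <- crossbasis(d$temp, lag = 21,
                          argvar = list(fun = "ns", df = 4),
                          arglag = list(fun = "ns", df = 4))
        mod <- glm(death ~ cb + ns(time, 7 * 14) + dow,
                   family = quasipoisson(), data = d)

        cen <- 21                                      # assumed minimum mortality temperature
        cp  <- crosspred(cb, mod, at = sort(unique(d$temp)), cen = cen)

        logrr <- cp$allfit[match(d$temp, cp$predvar)]  # cumulative log-RR per day
        af    <- 1 - exp(-logrr)                       # daily attributable fraction
        an    <- af * d$death                          # daily attributable number
        sum(an, na.rm = TRUE)                          # total temperature-attributable deaths

        # Components due to cold and heat, split at the centring temperature
        sum(an[d$temp < cen], na.rm = TRUE)            # cold-attributable
        sum(an[d$temp > cen], na.rm = TRUE)            # heat-attributable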

    Brief Report: Investigating Uncertainty in the Minimum Mortality Temperature: Methods and Application to 52 Spanish Cities.

    BACKGROUND: The minimum mortality temperature from J- or U-shaped curves varies across cities with different climates. This variation conveys information on adaptation, but the ability to characterize it is limited by the absence of a method to describe uncertainty in estimated minimum mortality temperatures. METHODS: We propose an approximate parametric bootstrap estimator of the confidence interval (CI) and standard error (SE) for the minimum mortality temperature from a temperature-mortality curve estimated by splines. RESULTS: The coverage of the estimated CIs was close to the nominal value (95%) in the simulated datasets, although the SEs were slightly high. Applying the method to 52 Spanish provincial capital cities showed larger minimum mortality temperatures in hotter cities, rising at almost exactly the same rate as annual mean temperature. CONCLUSIONS: The proposed method for computing CIs and SEs for minimums from spline curves allows minimum mortality temperatures in different cities to be compared, and their associations with climate to be investigated, while properly allowing for estimation uncertainty.
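
    A sketch of an approximate parametric bootstrap for the minimum mortality temperature along these lines (not the authors' published code): simulate the coefficients of the overall exposure-response from their estimated multivariate normal distribution, recompute the curve on a fine grid, and record where each simulated curve is lowest. The data and spline settings are illustrative assumptions.

        library(dlnm); library(splines); library(MASS)
        data("chicagoNMMAPS", package = "dlnm")
        d <- chicagoNMMAPS

        cb  <- crossbasis(d$temp, lag = 21,
                          argvar = list(fun = "ns", df = 4),
                          arglag = list(fun = "ns", df = 4))
        mod <- glm(death ~ cb + ns(time, 7 * 14) + dow,
                   family = quasipoisson(), data = d)

        # Point estimate of the MMT from the overall cumulative curve
        grid <- seq(min(d$temp), max(d$temp), by = 0.1)
        cp   <- crosspred(cb, mod, at = grid, cen = 20)
        mmt  <- grid[which.min(cp$allfit)]

        # Approximate parametric bootstrap on the reduced (overall) coefficients
        red <- crossreduce(cb, mod, type = "overall", cen = 20)
        onb <- do.call(onebasis, c(list(x = grid), attr(cb, "argvar")))
        set.seed(1)
        sim <- mvrnorm(1000, coef(red), vcov(red))
        mmt_sim <- apply(sim, 1, function(b) grid[which.min(onb %*% b)])

        quantile(mmt_sim, c(0.025, 0.975))   # approximate 95% CI for the MMT
        sd(mmt_sim)                          # approximate standard error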

    Ambient Air Pollution and Mortality in 652 Cities. Reply.


    Interrupted time series regression for the evaluation of public health interventions: a tutorial.

    Interrupted time series (ITS) analysis is a valuable study design for evaluating the effectiveness of population-level health interventions that have been implemented at a clearly defined point in time. It is increasingly being used to evaluate the effectiveness of interventions ranging from clinical therapy to national public health legislation. Although the design shares many properties with regression-based approaches used in other epidemiological studies, there are several features unique to time series data that require additional methodological consideration. In this tutorial we use a worked example to demonstrate a robust approach to ITS analysis using segmented regression. We begin by describing the design and considering when ITS is an appropriate design choice. We then discuss the essential, yet often omitted, step of proposing the impact model a priori. Subsequently, we demonstrate the approach to statistical analysis, including the main segmented regression model. Finally, we describe the main methodological issues associated with ITS analysis: over-dispersion of time series data, autocorrelation, adjustment for seasonal trends and control for time-varying confounders; we also outline some of the more complex design adaptations that can be used to strengthen the basic ITS design.
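
    A minimal sketch of the basic segmented regression described in the tutorial, for monthly counts (the data frame its, with columns count, pop, time as month number, month of the year, and a binary intervention indicator switching to 1 from the intervention date, is hypothetical; the published tutorial also covers slope changes and more flexible seasonal control).

        # Quasi-Poisson ITS model: level change at the intervention, linear
        # underlying trend, simple harmonic seasonal adjustment, population offset
        mod <- glm(count ~ offset(log(pop)) + intervention + time +
                     sin(2 * pi * month / 12) + cos(2 * pi * month / 12),
                   family = quasipoisson(), data = its)
        summary(mod)

        # Step change associated with the intervention, as a rate ratio with 95% CI
        exp(coef(mod)["intervention"])
        exp(confint.default(mod)["intervention", ])

        # Check residual autocorrelation
        acf(residuals(mod, type = "deviance"))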

    A tutorial on the case time series design for small-area analysis.

    Background: The increased availability of data on health outcomes and risk factors collected at fine geographical resolution is one of the main reasons for the rising popularity of epidemiological analyses conducted at the small-area level. However, this rich data setting poses important methodological issues related to modelling complexities and computational demands, as well as the linkage and harmonisation of data collected at different geographical levels. Methods: This tutorial illustrates the extension of the case time series design, originally proposed for individual-level analyses of short-term associations with time-varying exposures, to applications using data aggregated over small geographical areas. The case time series design embeds the longitudinal structure of time series data within the self-matched framework of case-only methods, offering a flexible and highly adaptable analytical tool. The methodology is well suited for modelling complex temporal relationships, and it provides an efficient computational scheme for large datasets including longitudinal measurements collected at a fine geographical level. Results: The application of the case time series design for small-area analyses is demonstrated using a real-data case study to assess the mortality risks associated with high temperature in the summers of 2006 and 2013 in London, UK. The example makes use of information on individual deaths, temperature, and socio-economic characteristics collected at different geographical levels. The tutorial describes the various steps of the analysis, namely the definition of the case time series structure and the linkage of the data, as well as the estimation of the risk associations and the assessment of differences in vulnerability. R code and data are made available to fully reproduce the results and the graphical descriptions. Conclusions: The extension of the case time series design for small-area analysis offers a valuable analytical tool that combines modelling flexibility and computational efficiency. The increasing availability of data collected at fine geographical scales provides opportunities for its application to address a wide range of epidemiological questions.
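
    A hedged sketch of a small-area case time series analysis (the stacked data frame cts, with one row per area and day and columns deaths, tmean, area and date, ordered by area and date, is hypothetical; the stratum definition, lag structure and spline settings are assumptions): the self-matched strata enter a conditional quasi-Poisson model fitted with gnm(), and the temperature terms are specified through a cross-basis.

        library(gnm); library(dlnm)

        # Self-matched strata: area by year-month (a common choice)
        cts$stratum <- factor(paste(cts$area, format(cts$date, "%Y-%m"), sep = ":"))

        # Cross-basis for temperature over lags 0-3, keeping areas as separate series
        cb <- crossbasis(cts$tmean, lag = 3,
                         argvar = list(fun = "ns", df = 3),
                         arglag = list(fun = "integer"),
                         group = cts$area)

        # Conditional quasi-Poisson regression, eliminating the stratum intercepts
        mod <- gnm(deaths ~ cb, eliminate = stratum,
                   family = quasipoisson(), data = cts)

        # Summarise and plot the cumulative temperature-mortality association
        pred <- crosspred(cb, mod, cen = 15, by = 1)
        plot(pred, "overall")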

    Corrigendum to: Interrupted time series regression for the evaluation of public health interventions: a tutorial.

    The originally published version of this article contained an algebraic definition of the regression model for interrupted time series (ITS) analysis that could lead to erroneous interpretations of the estimated parameters. The model was presented in the equation on page 351, right column, and in the following text. We provide here a more accurate definition.